1 trillion parameter MoE Flash News List | Blockchain.News

List of Flash News about 1 trillion parameter MoE

2025-10-20 17:12
Alibaba Expands Qwen3: Qwen3-Max 1T MoE Pricing, Qwen3-VL-235B-A22B Benchmarks, and Qwen3-Omni-30B Audio SOTA — Key Numbers for Traders

According to @DeepLearningAI, Alibaba added Qwen3-Max, a closed-weights mixture-of-experts model with roughly 1 trillion parameters, support for 262k-token inputs, and API pricing starting at about $1.20 per 1 million input tokens and $6.00 per 1 million output tokens, giving traders clear unit economics for AI workloads. The same report says Alibaba released Qwen3-VL-235B-A22B, an open-weights vision-language model for text, image, and video with context windows ranging from 262k up to 1 million tokens, which reports top performance on many image, video, and agentic benchmarks. DeepLearning.AI also notes that Alibaba introduced Qwen3-Omni-30B-A3B, an open-weights multimodal voice model that achieves state of the art on 22 of 36 audio and audio-visual tests, underscoring progress in speech-centric workloads.
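To make the quoted unit economics concrete, here is a minimal cost sketch using the starting rates cited above ($1.20 per 1M input tokens, $6.00 per 1M output tokens). The function name and the example token counts are illustrative assumptions, not figures from the report.

```python
# Sketch: per-request cost at the quoted Qwen3-Max starting rates.
# Rates come from the DeepLearning.AI figures cited above; the example
# workload (50k-token prompt, 2k-token answer) is hypothetical.

INPUT_RATE_PER_M = 1.20   # USD per 1 million input tokens
OUTPUT_RATE_PER_M = 6.00  # USD per 1 million output tokens

def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the quoted starting rates."""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

if __name__ == "__main__":
    # Hypothetical workload: 50,000 input tokens, 2,000 output tokens.
    print(f"${estimate_cost_usd(50_000, 2_000):.4f} per request")  # ≈ $0.0720
```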
